Nick Bostrom On Existential Risks

Monday, July 9, 2012

'Existential risk is something that has never happened in all of human history. It is defined as something that either causes the extinction of Earth-originating intelligent life or that permanently and drastically will destroy our future potential for desirable development,' says Professor Nick Bostrom. According to him, preventing these risks is humanity's greatest challenge.
The topic of existential risk has occupied Professor Bostrom, Director of Oxford University's Future of Humanity Institute, for many years.

Bostrom is the founding Director of the Future of Humanity Institute and of the Programme on the Impacts of Future Technology within the Oxford Martin School. He is the author of some 200 publications, including Anthropic Bias, Global Catastrophic Risks, and Human Enhancement, and a forthcoming book on Superintelligence.



He previously taught at Yale, and he was a Postdoctoral Fellow of the British Academy. Bostrom has a background in physics, computational neuroscience, and mathematical logic as well as philosophy. He is best known for his work in five areas: (i) the concept of existential risk; (ii) the simulation argument; (iii) anthropics (developing the first mathematically explicit theory of observation selection effects); (iv) transhumanism, including related issues in bioethics and on consequences of future technologies; and (v) foundations and practical implications of consequentialism (Astronomical Waste, Infinite Ethics, Technological Revolutions).

Image Source: intelligenceexplosion.com

Bostrom is currently working on a book on the possibility of an intelligence explosion and on the existential risks and strategic issues related to the prospect of machine superintelligence.

In the video below, Bostrom talks to Global Policy's Dr. Jill Stuart about the possibility of existential threats and how academics and practitioners can begin to think about tackling these risks.

SOURCE: Global Policy Challenge

By 33rd Square

